# Multi-domain corpus training
## KoGPT-J 350M (heegyu)
- License: MIT
- Description: A Korean text-generation model based on the GPT-J architecture with 350 million parameters, suitable for a variety of Korean text-generation tasks.
- Tags: Large Language Model, Korean
## GPT-2 Spanish (DeepESP)
- License: MIT
- Description: A language-generation model trained on 11.5 GB of Spanish text, using the same parameter configuration as the OpenAI GPT-2 small version.
- Tags: Large Language Model, Supports Multiple Languages